Margin Based Dimensionality Reduction and Generalization

Authors

  • Jing Peng
  • Stefan Robila
  • Wei Fan
  • Guna Seetharaman
Abstract

Linear discriminant analysis (LDA) for dimension reduction has been applied to a wide variety of problems such as face recognition. However, it suffers a major computational difficulty when the number of dimensions exceeds the sample size. In this paper, we propose a margin based criterion for linear dimension reduction that addresses the above problem associated with LDA. We establish an error bound for our proposed technique by showing its relation to least squares regression. In addition, well established numerical procedures such as semi-definite programming can optimize the proposed criterion. We demonstrate the efficacy of our proposal and compare it against other competing techniques using a number of examples.
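To illustrate the small-sample-size regime the abstract refers to, the sketch below builds the standard between- and within-class scatter matrices on toy data with more dimensions than samples, checks that the within-class scatter is singular (so the classical LDA eigenproblem on S_w⁻¹S_b is ill-posed), and then projects with a margin-style difference criterion tr(Wᵀ(S_b − S_w)W), which needs no matrix inversion. The specific criterion here is an assumption for illustration, not necessarily the exact one proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data with more dimensions than samples (d = 50 > n = 20),
# the regime where classical LDA breaks down.
d, n_per_class = 50, 10
X0 = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, d))
X1 = rng.normal(loc=1.0, scale=1.0, size=(n_per_class, d))
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

mean = X.mean(axis=0)
S_b = np.zeros((d, d))  # between-class scatter
S_w = np.zeros((d, d))  # within-class scatter
for c in (0, 1):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    S_b += len(Xc) * np.outer(mc - mean, mc - mean)
    S_w += (Xc - mc).T @ (Xc - mc)

# With n < d, S_w is rank-deficient, so eig(S_w^{-1} S_b) is ill-posed.
assert np.linalg.matrix_rank(S_w) < d

# A margin-style criterion tr(W^T (S_b - S_w) W) avoids the inversion:
# the optimal W is simply the top eigenvectors of S_b - S_w.
evals, evecs = np.linalg.eigh(S_b - S_w)
W = evecs[:, np.argsort(evals)[::-1][:1]]  # 1-D projection

# The learned projection separates the two class means.
z = X @ W
print(abs(z[y == 0].mean() - z[y == 1].mean()))
```

Because the criterion is a difference rather than a ratio of scatters, it remains well defined even when S_w is singular, which is the practical point the abstract makes.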


Similar Papers

Frugal hypothesis testing and classification

The design and analysis of decision rules using detection theory and statistical learning theory is important because decision making under uncertainty is pervasive. Three perspectives on limiting the complexity of decision rules are considered in this thesis: geometric regularization, dimensionality reduction, and quantization or clustering. Controlling complexity often reduces resource usage ...


Learning Through Non-linearly Supervised Dimensionality Reduction

Dimensionality reduction is a crucial ingredient of machine learning and data mining, boosting classification accuracy through the isolation of patterns via omission of noise. Nevertheless, recent studies have shown that dimensionality reduction can benefit from label information, via a joint estimation of predictors and target variables from a low-rank representation. In the light of such insp...


A Quadratic Margin-based Model for Weighting Fuzzy Classification Rules Inspired by Support Vector Machines

Recently, tuning the weights of the rules in Fuzzy Rule-Base Classification Systems has been studied as a way to improve classification accuracy. In this paper, a margin-based optimization model, inspired by Support Vector Machine classifiers, is proposed to compute these fuzzy rule weights. This approach not only considers both accuracy and generalization criteria in a single objective fu...


Large-margin Weakly Supervised Dimensionality Reduction

This paper studies dimensionality reduction in a weakly supervised setting, in which the preference relationship between examples is indicated by weak cues. A novel framework is proposed that integrates two aspects of the large margin principle (angle and distance), which simultaneously encourage angle consistency between preference pairs and maximize the distance between examples in preference...


Clausthal University of Technology, IfI-05-14, Clausthal-Zellerfeld, 2005

We extend a recent variant of the prototype-based classifier learning vector quantization to a scheme which locally adapts relevance terms during learning. We derive explicit dimensionality-independent large-margin generalization bounds for this classifier and show that the method can be seen as margin maximizer.



Publication date: 2010